Householder Transformation

In linear algebra, a Householder transformation (also known as a Householder reflection or elementary reflector) is a linear transformation that describes a reflection about a plane or hyperplane containing the origin. The Householder transformation was used in a 1958 paper by Alston Scott Householder. Its analogue over general inner product spaces is the Householder operator.


Definition


Transformation

The reflection hyperplane can be defined by its ''normal vector'', a unit vector v (a vector with length 1) that is orthogonal to the hyperplane. The reflection of a point x about this hyperplane is the linear transformation:

: x - 2\langle x, v\rangle v = x - 2v\left(v^\textsf{H} x\right),

where v is given as a column unit vector with Hermitian transpose v^\textsf{H}.
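As a minimal illustration of this formula (a sketch assuming NumPy; the function name reflect is ours, not a library routine), the reflection can be applied to a vector directly, without forming any matrix:

<syntaxhighlight lang="python">
import numpy as np

def reflect(x, v):
    """Reflect x about the hyperplane through the origin with unit normal v.

    Implements x - 2 <x, v> v; v is normalized here as a safeguard.
    """
    v = v / np.linalg.norm(v)
    return x - 2 * np.vdot(v, x) * v   # np.vdot conjugates v, matching v^H x

x = np.array([3.0, 4.0])
v = np.array([0.0, 1.0])               # unit normal of the x-axis (the mirror line)
print(reflect(x, v))                   # [ 3. -4.]
</syntaxhighlight>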


Householder matrix

The matrix constructed from this transformation can be expressed in terms of an outer product as:

: P = I - 2vv^\textsf{H},

which is known as the Householder matrix, where I is the identity matrix.


Properties

The Householder matrix has the following properties: * it is
Hermitian {{Short description, none Numerous things are named after the French mathematician Charles Hermite (1822–1901): Hermite * Cubic Hermite spline, a type of third-degree spline * Gauss–Hermite quadrature, an extension of Gaussian quadrature m ...
: P = P^\textsf, * it is
unitary Unitary may refer to: Mathematics * Unitary divisor * Unitary element * Unitary group * Unitary matrix * Unitary morphism * Unitary operator * Unitary transformation * Unitary representation * Unitarity (physics) * ''E''-unitary inverse semigrou ...
: P^ = P^\textsf, * hence it is involutory: P = P^. * A Householder matrix has eigenvalues \pm 1. To see this, notice that if u is orthogonal to the vector v which was used to create the reflector, then Pu = u, i.e., 1 is an eigenvalue of multiplicity n - 1, since there are n - 1 independent vectors orthogonal to v. Also, notice Pv = -v, and so -1 is an eigenvalue with multiplicity 1. * The
determinant In mathematics, the determinant is a scalar value that is a function of the entries of a square matrix. It characterizes some properties of the matrix and the linear map represented by the matrix. In particular, the determinant is nonzero if and ...
of a Householder reflector is -1, since the determinant of a matrix is the product of its eigenvalues, in this case one of which is -1 with the remainder being 1 (as in the previous point).
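As a quick numerical check of these properties (a sketch assuming NumPy; the random complex unit vector v is just an arbitrary choice):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v /= np.linalg.norm(v)                         # unit normal vector

P = np.eye(n) - 2 * np.outer(v, v.conj())      # Householder matrix P = I - 2 v v^H

print(np.allclose(P, P.conj().T))              # Hermitian: P = P^H
print(np.allclose(P @ P.conj().T, np.eye(n)))  # unitary: P P^H = I
print(np.allclose(P @ P, np.eye(n)))           # involutory: P^2 = I
print(np.round(np.linalg.eigvalsh(P), 6))      # eigenvalues: one -1, the rest +1
print(np.round(np.linalg.det(P).real, 6))      # determinant: -1
</syntaxhighlight>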


Applications


Geometric optics

In geometric optics, specular reflection can be expressed in terms of the Householder matrix (see ''Specular reflection'').


Numerical linear algebra

Householder transformations are widely used in numerical linear algebra, for example, to annihilate the entries below the main diagonal of a matrix, to perform QR decompositions and in the first step of the QR algorithm. They are also widely used for transforming to a Hessenberg form. For symmetric or Hermitian matrices, the symmetry can be preserved, resulting in tridiagonalization.
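The basic annihilation step maps a column onto a multiple of the first standard basis vector. A minimal sketch of that single step, assuming NumPy (the helper name householder_vector is ours; the sign choice follows the usual convention for numerical stability):

<syntaxhighlight lang="python">
import numpy as np

def householder_vector(x):
    """Unit vector v such that (I - 2 v v^T) x has zeros below its first entry."""
    v = np.array(x, dtype=float)
    # add +/-||x|| to the first entry, choosing the sign that avoids cancellation
    v[0] += np.copysign(np.linalg.norm(x), x[0])
    return v / np.linalg.norm(v)

x = np.array([3.0, 1.0, 5.0, 1.0])
v = householder_vector(x)
print(x - 2 * np.dot(v, x) * v)   # [-6.  0.  0.  0.] up to rounding
</syntaxhighlight>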


QR decomposition

Householder reflections can be used to calculate QR decompositions by reflecting first one column of a matrix onto a multiple of a standard basis vector, calculating the transformation matrix, multiplying it with the original matrix and then recursing down the (i, i) minors of that product.
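A compact sketch of this procedure, assuming NumPy (householder_qr is our own name, not a library API); it accumulates Q explicitly for clarity, which production code usually avoids:

<syntaxhighlight lang="python">
import numpy as np

def householder_qr(A):
    """QR decomposition of a real matrix via Householder reflections (didactic)."""
    m, n = A.shape
    R = np.array(A, dtype=float)
    Q = np.eye(m)
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])   # sign choice avoids cancellation
        norm_v = np.linalg.norm(v)
        if norm_v == 0:                                # column already annihilated
            continue
        v /= norm_v
        # apply the reflector I - 2 v v^T to the trailing rows of R and accumulate Q
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.array([[4., 1., -2., 2.],
              [1., 2., 0., 1.],
              [-2., 0., 3., -2.],
              [2., 1., -2., -1.]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0))   # True True
</syntaxhighlight>

Library implementations typically keep the reflectors in factored form and apply them implicitly rather than forming Q as a dense matrix.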


Tridiagonalization

This procedure is presented in ''Numerical Analysis'' by Burden and Faires. It uses a slightly altered \operatorname{sgn} function with \operatorname{sgn}(0) = 1. To form the Householder matrix in the first step, we need to determine \alpha and r, which are:

:\begin{align}
\alpha &= -\operatorname{sgn}\left(a_{21}\right)\sqrt{\sum_{j=2}^n a_{j1}^2}; \\
r &= \sqrt{\frac{1}{2}\left(\alpha^2 - a_{21}\alpha\right)};
\end{align}

From \alpha and r, construct the vector v:

:v^{(1)} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix},

where v_1 = 0, v_2 = \frac{a_{21} - \alpha}{2r}, and

:v_k = \frac{a_{k1}}{2r} for each k = 3, 4, \ldots, n.

Then compute:

:\begin{align}
P^1 &= I - 2v^{(1)} \left(v^{(1)}\right)^\textsf{T} \\
A^{(2)} &= P^1 AP^1
\end{align}

Having found P^1 and computed A^{(2)}, the process is repeated for k = 2, 3, \ldots, n - 2 as follows:

:\begin{align}
\alpha &= -\operatorname{sgn}\left(a^k_{k+1,k}\right)\sqrt{\sum_{j=k+1}^n \left(a^k_{jk}\right)^2} \\
r &= \sqrt{\frac{1}{2}\left(\alpha^2 - a^k_{k+1,k}\alpha\right)} \\
v^k_1 &= v^k_2 = \cdots = v^k_k = 0 \\
v^k_{k+1} &= \frac{a^k_{k+1,k} - \alpha}{2r} \\
v^k_j &= \frac{a^k_{jk}}{2r} \text{ for } j = k + 2,\ k + 3,\ \ldots,\ n \\
P^k &= I - 2v^{(k)} \left(v^{(k)}\right)^\textsf{T} \\
A^{(k+1)} &= P^k A^{(k)}P^k
\end{align}

Continuing in this manner, the tridiagonal and symmetric matrix is formed.
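The procedure above translates fairly directly into code. A didactic sketch assuming NumPy (householder_tridiagonalize is our own name, not a library routine), using the same \operatorname{sgn}(0) = 1 convention and forming each P^k explicitly rather than applying the reflectors implicitly:

<syntaxhighlight lang="python">
import numpy as np

def sgn(x):
    """Sign function with the convention sgn(0) = 1, as above."""
    return 1.0 if x >= 0 else -1.0

def householder_tridiagonalize(A):
    """Reduce a real symmetric matrix to a similar tridiagonal matrix (didactic)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for k in range(n - 2):                     # k = 1, ..., n-2 in the 1-based notation above
        alpha = -sgn(A[k + 1, k]) * np.sqrt(np.sum(A[k + 1:, k] ** 2))
        r = np.sqrt((alpha ** 2 - A[k + 1, k] * alpha) / 2.0)
        if r == 0.0:                           # column is already in the desired form
            continue
        v = np.zeros(n)
        v[k + 1] = (A[k + 1, k] - alpha) / (2.0 * r)
        v[k + 2:] = A[k + 2:, k] / (2.0 * r)
        P = np.eye(n) - 2.0 * np.outer(v, v)   # P^k = I - 2 v^(k) (v^(k))^T
        A = P @ A @ P                          # A^(k+1) = P^k A^(k) P^k
    return A
</syntaxhighlight>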


Examples

In this example, also from Burden and Faires, the given matrix is transformed to the similar tridiagonal matrix A_3 by using the Householder method.

: \mathbf{A} = \begin{bmatrix} 4 & 1 & -2 & 2 \\ 1 & 2 & 0 & 1 \\ -2 & 0 & 3 & -2 \\ 2 & 1 & -2 & -1 \end{bmatrix},

Following those steps in the Householder method, we have the first Householder matrix:

:\begin{align}
Q_1 &= \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & -\frac{1}{3} & \frac{2}{3} & -\frac{2}{3} \\ 0 & \frac{2}{3} & \frac{2}{3} & \frac{1}{3} \\ 0 & -\frac{2}{3} & \frac{1}{3} & \frac{2}{3} \end{bmatrix}, \\
A_2 = Q_1 A Q_1 &= \begin{bmatrix} 4 & -3 & 0 & 0 \\ -3 & \frac{10}{3} & 1 & \frac{4}{3} \\ 0 & 1 & \frac{5}{3} & -\frac{4}{3} \\ 0 & \frac{4}{3} & -\frac{4}{3} & -1 \end{bmatrix},
\end{align}

Using A_2, we form

:\begin{align}
Q_2 &= \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -\frac{3}{5} & -\frac{4}{5} \\ 0 & 0 & -\frac{4}{5} & \frac{3}{5} \end{bmatrix}, \\
A_3 = Q_2 A_2 Q_2 &= \begin{bmatrix} 4 & -3 & 0 & 0 \\ -3 & \frac{10}{3} & -\frac{5}{3} & 0 \\ 0 & -\frac{5}{3} & -\frac{33}{25} & \frac{68}{75} \\ 0 & 0 & \frac{68}{75} & \frac{149}{75} \end{bmatrix},
\end{align}

As we can see, the final result is a tridiagonal symmetric matrix which is similar to the original one. The process is finished after two steps.
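Continuing with the householder_tridiagonalize sketch from the previous section (our hypothetical helper, not a library routine), the example can be checked numerically:

<syntaxhighlight lang="python">
import numpy as np

A = np.array([[ 4.,  1., -2.,  2.],
              [ 1.,  2.,  0.,  1.],
              [-2.,  0.,  3., -2.],
              [ 2.,  1., -2., -1.]])

A3 = householder_tridiagonalize(A)   # sketch defined in the previous section
print(np.round(A3, 4))
# approximately [[4, -3, 0, 0], [-3, 10/3, -5/3, 0],
#                [0, -5/3, -33/25, 68/75], [0, 0, 68/75, 149/75]]
</syntaxhighlight>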


Computational and theoretical relationship to other unitary transformations

The Householder transformation is a reflection about a hyperplane with unit normal vector v, as stated earlier. An N-by-N unitary transformation U satisfies UU^\textsf{H} = I. Taking the determinant (N-th power of the geometric mean) and trace (proportional to the arithmetic mean) of a unitary matrix reveals that its eigenvalues \lambda_i have unit modulus. This can be seen directly and swiftly:

:\begin{align}
\frac{\operatorname{tr}\left(UU^\textsf{H}\right)}{N} &= \frac{\sum_{j=1}^N \left|\lambda_j\right|^2}{N} = 1, &
\operatorname{det}\left(UU^\textsf{H}\right) &= \prod_{j=1}^N \left|\lambda_j\right|^2 = 1.
\end{align}

Since the arithmetic and geometric means are equal only if the variables are constant (see inequality of arithmetic and geometric means), we establish the claim of unit modulus.

For the case of real-valued unitary matrices we obtain orthogonal matrices, UU^\textsf{T} = I. It follows rather readily (see orthogonal matrix) that any orthogonal matrix can be decomposed into a product of 2-by-2 rotations, called Givens rotations, and Householder reflections. This is appealing intuitively since multiplication of a vector by an orthogonal matrix preserves the length of that vector, and rotations and reflections exhaust the set of (real-valued) geometric operations that render a vector's length invariant.

The Householder transformation was shown to have a one-to-one relationship with the canonical coset decomposition of unitary matrices defined in group theory, which can be used to parametrize unitary operators in a very efficient manner.

Finally, we note that a single Householder transform, unlike a solitary Givens transform, can act on all columns of a matrix, and as such exhibits the lowest computational cost for QR decomposition and tridiagonalization. The penalty for this "computational optimality" is, of course, that Householder operations cannot be as deeply or efficiently parallelized. As such, Householder is preferred for dense matrices on sequential machines, whilst Givens is preferred for sparse matrices and/or parallel machines.


See also

* Givens rotation
* Jacobi rotation


References

* Burden, Richard L.; Faires, J. Douglas. ''Numerical Analysis''. (Cited above for the tridiagonalization procedure and the worked example.)